Improved Multidimensional Scaling Analysis Using Neural Networks with Distance-Error Backpropagation

Authors

  • Lluís Garrido
  • Sergio Gómez
  • Jaume Roca

Abstract

We show that neural networks, with a suitable error function for backpropagation, can be successfully used for metric multidimensional scaling (MDS) (i.e., dimensional reduction while trying to preserve the original distances between patterns) and are in fact able to outdo the standard algebraic approach to MDS, known as classical scaling.
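To make the approach concrete, the following is a minimal sketch of metric MDS with a neural network trained by backpropagating a distance error. The architecture, optimizer, and exact stress-style loss below are illustrative assumptions, not the authors' formulation:

```python
# A minimal sketch (not the authors' exact formulation) of metric MDS with
# a neural network: a feed-forward net maps patterns to a low-dimensional
# space, and the backpropagated error penalizes mismatches between the
# original pairwise distances and the distances between network outputs.
import torch

torch.manual_seed(0)

def pairwise_dist(Z, eps=1e-9):
    # Euclidean distances; eps keeps the sqrt gradient finite on the diagonal.
    sq = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return torch.sqrt(sq + eps)

X = torch.randn(100, 10)          # toy data: 100 patterns in 10 dimensions
D = pairwise_dist(X)              # target distances in the original space

net = torch.nn.Sequential(        # projection down to 2 dimensions
    torch.nn.Linear(10, 32),
    torch.nn.Tanh(),
    torch.nn.Linear(32, 2),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for epoch in range(500):
    D_hat = pairwise_dist(net(X))        # distances after projection
    loss = ((D_hat - D) ** 2).mean()     # distance-error ("stress") loss
    opt.zero_grad()
    loss.backward()                      # backpropagate the distance error
    opt.step()

print(f"final mean squared distance error: {loss.item():.4f}")
```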

Similar articles

Neural Network Image Scaling Using Spatial Errors

We propose a general method for gradient-based training of neural network (NN) models to scale multidimensional signal data. In the case of image data, the goal is to fit models that produce images of high perceptual quality, as opposed to simply a high peak signal-to-noise ratio (PSNR). There have been a number of perceptual image error measures proposed in the literature, the majority of whic...
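As a rough illustration of the idea, the sketch below trains a small upscaling network with a spatially weighted error instead of plain pixel MSE. The network, the synthetic training pair, and the edge-based weighting are assumptions standing in for the paper's perceptual measure:

```python
# A toy sketch of training an image-scaling network with a spatially
# weighted error rather than plain pixel MSE; the edge-based weighting is
# a crude stand-in for a perceptual error measure, not the paper's method.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Fake training pair: low-res 16x16 inputs and bilinear 32x32 targets.
lo = torch.rand(8, 1, 16, 16)
hi = F.interpolate(lo, scale_factor=2, mode="bilinear", align_corners=False)

net = torch.nn.Sequential(
    torch.nn.Upsample(scale_factor=2, mode="nearest"),
    torch.nn.Conv2d(1, 16, 3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Conv2d(16, 1, 3, padding=1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def spatial_loss(pred, target):
    # Weight errors more heavily near edges of the target image,
    # a rough proxy for perceptually important regions.
    dy = target[:, :, 1:, :] - target[:, :, :-1, :]
    dx = target[:, :, :, 1:] - target[:, :, :, :-1]
    w = 1.0 + F.pad(dy.abs(), (0, 0, 0, 1)) + F.pad(dx.abs(), (0, 1, 0, 0))
    return (w * (pred - target) ** 2).mean()

for step in range(200):
    loss = spatial_loss(net(lo), hi)
    opt.zero_grad()
    loss.backward()
    opt.step()
```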

Investigation of Mechanical Properties of Self Compacting Polymeric Concrete with Backpropagation Network

Acrylic polymer is highly stable against chemicals and is a good choice when concrete is subject to chemical attack. In this study, self-compacting concrete (SCC) made with acrylic polymer, nanosilica, and microsilica has been investigated. The results of experimental testing showed that the addition of microsilica and acrylic polymer decreased the tensile, compressive, and bending strength...

Feature scaling in support vector data description

When only samples of one class are easily accessible in a classification problem, it is called a one-class classification problem. Many standard classifiers, like backpropagation neural networks, fail on this data. Some other classifiers, like k-means clustering or the nearest-neighbor classifier, can be applied after some minor changes. In this paper we focus on the support vector data de...
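Scikit-learn does not ship SVDD directly, but its OneClassSVM with an RBF kernel is a closely related stand-in; the sketch below also illustrates why per-feature scaling matters for such distance-based boundaries (the data and parameters are assumptions for illustration):

```python
# A minimal sketch of one-class classification with feature scaling.
# OneClassSVM with an RBF kernel is used as a stand-in for SVDD, and the
# StandardScaler step shows why per-feature scaling matters when one
# feature's spread would otherwise dominate the kernel distances.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Training data from the target class only; the second feature has a
# much larger spread than the first.
X_train = rng.normal(0, [1.0, 100.0], size=(200, 2))

model = make_pipeline(StandardScaler(), OneClassSVM(kernel="rbf", nu=0.05))
model.fit(X_train)

X_test = np.array([[0.0, 0.0], [10.0, 0.0]])
print(model.predict(X_test))  # +1 = target class, -1 = outlier
```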

Improved learning of Riemannian metrics for exploratory analysis

We have earlier introduced a principle for learning metrics, which shows how metric-based methods can be made to focus on discriminative properties of data. The main applications are in supervising unsupervised learning to model interesting variation in data, instead of modeling all variation as plain unsupervised learning does. The metrics are derived by approximations to an information-geomet...

Random Walk Initialization for Training Very Deep Feedforward Networks

Training very deep networks is an important open problem in machine learning. One of many difficulties is that the norm of the back-propagated error gradient can grow or decay exponentially. Here we show that training very deep feed-forward networks (FFNs) is not as difficult as previously thought. Unlike when backpropagation is applied to a recurrent network, application to an FFN amounts to m...
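A toy numerical sketch of the underlying difficulty: backpropagating through many random linear layers multiplies the gradient norm layer by layer, so its magnitude depends exponentially on the weight scale. The depth, width, and scale values below are illustrative assumptions; the paper's random walk initialization chooses this scale so the log of the gradient norm stays stable:

```python
# A toy demonstration of why gradient norms in deep feed-forward nets grow
# or decay exponentially with depth, and how the weight scale controls it.
import numpy as np

rng = np.random.default_rng(0)
depth, width = 100, 64

def grad_norm(scale):
    # Backpropagate a unit error vector through `depth` random linear
    # layers; each layer multiplies the gradient by W^T.
    g = np.ones(width) / np.sqrt(width)
    for _ in range(depth):
        W = rng.normal(0, scale / np.sqrt(width), size=(width, width))
        g = W.T @ g
    return np.linalg.norm(g)

for scale in (0.8, 1.0, 1.2):
    print(f"scale {scale}: gradient norm after {depth} layers = "
          f"{grad_norm(scale):.3e}")
# Scales below 1 decay exponentially and scales above 1 explode; near the
# critical value the log of the norm behaves like a random walk.
```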


Journal:
  • Neural Computation

Volume 11, Issue 3

Pages: -

Publication year: 1999